A quantitative functional central limit theorem for shallow neural networks / Cammarota, V.; Marinucci, D.; Salvi, M.; Vigogna, S. - In: Modern Stochastics: Theory and Applications. - ISSN 2351-6054. - 11:1 (2024), pp. 85-108. DOI: 10.15559/23-VMSTA238
A quantitative functional central limit theorem for shallow neural networks
Cammarota, V.; Marinucci, D.; Salvi, M.; Vigogna, S.
2024
Abstract
We prove a quantitative functional central limit theorem for one-hidden-layer neural networks with generic activation function. Our rates of convergence depend heavily on the smoothness of the activation function, and they range from logarithmic, for nondifferentiable nonlinearities such as the ReLU, to √n for highly regular activations. Our main tools are based on functional versions of the Stein-Malliavin method; in particular, we rely on a quantitative functional central limit theorem recently established by Bourguin and Campese [Electron. J. Probab. 25 (2020), 150].
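To make the setting concrete, the following is a minimal numerical sketch (not the authors' code) of the kind of convergence the paper quantifies. It assumes the standard random-weight parametrization f_n(x) = n^{-1/2} Σ_{j=1}^n v_j σ(⟨w_j, x⟩) with i.i.d. standard Gaussian weights, and takes the ReLU as the example activation; the function name `shallow_net` and all parameter choices below are illustrative assumptions, not the paper's setup. As the width n grows, the network output at a fixed input becomes approximately Gaussian; the paper bounds the speed of this convergence, in a functional sense, in terms of the smoothness of the activation.

```python
import numpy as np

rng = np.random.default_rng(0)

def shallow_net(x, W, v):
    # One-hidden-layer ReLU network with 1/sqrt(n) scaling:
    # f_n(x) = (1/sqrt(n)) * sum_j v_j * relu(<w_j, x>)
    n = v.shape[0]
    pre = W @ x                      # hidden pre-activations, shape (n,)
    return (v @ np.maximum(pre, 0.0)) / np.sqrt(n)

d, n, n_trials = 3, 2000, 5000
x = rng.standard_normal(d)
x /= np.linalg.norm(x)               # fixed input on the unit sphere

# Monte Carlo over independent networks: each draw of (W, v) yields one sample of f_n(x)
samples = np.empty(n_trials)
for t in range(n_trials):
    W = rng.standard_normal((n, d))  # i.i.d. N(0,1) input weights
    v = rng.standard_normal(n)       # i.i.d. N(0,1) output weights
    samples[t] = shallow_net(x, W, v)

# For w ~ N(0, I_d) and ||x|| = 1, E[relu(<w, x>)^2] = 1/2, so the Gaussian limit
# at this input has mean 0 and variance 1/2.
print("empirical mean    :", samples.mean())   # close to 0
print("empirical variance:", samples.var())    # close to 0.5
```

In this sketch the empirical law of f_n(x) should be close to N(0, 1/2) for large n; the theorem in the paper makes the distance to the Gaussian limit explicit as a function of n and of the regularity of σ.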
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| Cammarota_quantitative-functional-central_2024.pdf | Open access | Publisher's version (published version with the publisher's layout) | Creative Commons | 235.16 kB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


